TikTok is introducing a new set of parental controls to users worldwide, including in the U.S. The features, collectively referred to as “Family Pairing,” will allow parents to set controls on Screen Time Management, Restricted Mode and Direct Messages for their teens. The company will also now disable direct messaging for users under the age of 16 in all markets. A similar set of features, designed with European laws and regulations in mind, launched in the U.K. in February.
In that market, the features were called “Family Safety Mode.”
Today marks the official introduction of “Family Pairing,” but TikTok says the worldwide rollout will take place over the “coming weeks.”
To use the new controls, parents of teens age 13 and older will be able to link their account to their child’s, which requires the parent to set up their own TikTok account. This allows the parent to control how long their child can use the TikTok app, manage who the teen can direct message with (or turn messaging off entirely) and opt in to TikTok’s Restricted Mode for the child’s account in order to limit inappropriate content.
Restricted Mode is not a well-explained feature, but for an app of TikTok’s scale, it likely relies in large part on users flagging inappropriate videos they come across. Parents should be aware, then, that this is not equivalent to setting parental controls on a video streaming app like Netflix, or restricting what a child can download from the App Store on their phone. In other words, some inappropriate or more adult material could still slip through.
Both Screen Time Management and Restricted Mode are existing controls that TikTok users can set for themselves via the app’s Digital Wellbeing section. But with Family Pairing, the parent can set these controls for their child, instead of relying on the teen to do so themselves.
TikTok already offered a number of Direct Messaging controls before today, which allow users to limit messages to approved followers only, restrict the audience or disable direct messages altogether. TikTok also blocks images and videos in messages to cut down on other issues.
But with Family Pairing, parents can choose to what extent teens can message privately on the platform, if at all.
And in a move that will likely enrage teens, TikTok has decided to automatically disable Direct Messages for all registered accounts belonging to users under the age of 16. (Prepare to see a lot more activity and private conversations taking place in the TikTok comments section!) This change goes live on April 30.
The changes give parents far more control over their child’s use of TikTok than any other social media app on the market offers today, outside of those designed exclusively with families and children in mind. However, the parental controls are only a subset of the controls users can set for themselves. For example, users can choose to make their accounts private, turn off comments and control who can duet with them, among other things.
But the options may relieve some parents’ stress about how addictive the TikTok app has become. Teen users are spending significant amounts of time on the short video app — so much that TikTok itself even launched its own in-app PSA that encourages users to “take a break” from their phone.
TikTok offers other resources for parents, as well, including educational safety videos and parental guides.
It’s an interesting decision on TikTok’s part to launch screen time-limiting features and other restrictions amid a global pandemic, when teens are stuck at home with nothing much to do but watch videos, chat and play games. But with families at home together, there may be no better time than now to have a conversation about how much social media is too much.
“More than ever, families are turning to internet platforms like TikTok to stay entertained, informed, and connected. That was, of course, happening before COVID-19, but it has only accelerated since the outbreak began and social distancing brought families closer together,” writes TikTok director of Trust & Safety, Jeff Collins, in an announcement. “The embrace of platforms like ours is providing families with joint tools to express their creativity, share their stories, and show support for their communities. At the same time, they are often learning to navigate the digital landscape together and focused on ensuring a safe experience,” he said.
The changes follow increased scrutiny of TikTok, owned by Beijing-based ByteDance, by government regulators, and the 2019 fine of $5.7 million levied against Musical.ly (which had been acquired by ByteDance) by the FTC for violation of U.S. children’s privacy law COPPA.
TikTok has responded to these concerns in a variety of ways, including the introduction of the TikTok Content Advisory Council; the release of new Community Guidelines; the publication of its first Transparency Report; the hiring of a global General Counsel; the expansion of its Trust & Safety hubs in the U.S., Ireland and Singapore; and the launch of a Transparency Center open to outside experts who want to review its moderation practices.